Cox's Theorem

Cox's theorem, named after the physicist Richard Threlkeld Cox, is a derivation of the laws of probability theory from a certain set of postulates. This derivation justifies the so-called "logical" interpretation of probability, as the laws of probability derived by Cox's theorem are applicable to any proposition. Logical (also known as objective Bayesian) probability is a type of Bayesian probability. Other forms of Bayesianism, such as the subjective interpretation, are given other justifications.


Cox's assumptions

Cox wanted his system to satisfy the following conditions:
# Divisibility and comparability – The plausibility of a proposition is a real number and is dependent on information we have related to the proposition.
# Common sense – Plausibilities should vary sensibly with the assessment of plausibilities in the model.
# Consistency – If the plausibility of a proposition can be derived in many ways, all the results must be equal.

The postulates as stated here are taken from Arnborg and Sjödin (Stefan Arnborg and Gunnar Sjödin, ''On the foundations of Bayesianism,'' preprint, Nada, KTH, 1999, ftp://ftp.nada.kth.se/pub/documents/Theory/Stefan-Arnborg/06arnborg.pdf; ''A note on the foundations of Bayesianism,'' preprint, Nada, KTH, 2000a, ftp://ftp.nada.kth.se/pub/documents/Theory/Stefan-Arnborg/fobshle.pdf; "Bayes rules in finite models," ''European Conference on Artificial Intelligence,'' Berlin, 2000b, ftp://ftp.nada.kth.se/pub/documents/Theory/Stefan-Arnborg/fobc1.pdf). "Common sense" includes consistency with Aristotelian logic in the sense that logically equivalent propositions shall have the same plausibility.

The postulates as originally stated by Cox were not mathematically rigorous (although more so than the informal description above), as noted by Halpern (Joseph Y. Halpern, "A counterexample to theorems of Cox and Fine," ''Journal of AI Research,'' 10, 67–85, 1999, http://www.jair.org/media/536/live-536-2054-jair.ps.Z; "Technical Addendum, Cox's Theorem Revisited," ''Journal of AI Research,'' 11, 429–435, 1999, http://www.jair.org/media/644/live-644-1840-jair.ps.Z). However, it appears to be possible to augment them with various mathematical assumptions made either implicitly or explicitly by Cox to produce a valid proof.

Cox's notation:
:The plausibility of a proposition A given some related information X is denoted by A\mid X.

Cox's postulates and functional equations are:
*The plausibility of the conjunction AB of two propositions A, B, given some related information X, is determined by the plausibility of A given X and that of B given AX.
:In the form of a functional equation
::AB\mid X=g(A\mid X,B\mid AX)
:Because of the associative nature of conjunction in propositional logic, consistency with logic gives a functional equation saying that the function g is an associative binary operation.
*Additionally, Cox postulates the function g to be monotonic.
:All strictly increasing associative binary operations on the real numbers are isomorphic to multiplication of numbers in a subinterval of [0,+\infty], which means that there is a monotonic function w mapping plausibilities to [0,+\infty] such that
::w(AB\mid X)=w(A\mid X)w(B\mid AX)
*In case A given X is certain, we have AB\mid X=B\mid X and B\mid AX=B\mid X due to the requirement of consistency. The general equation then leads to
::w(B\mid X)=w(A\mid X)w(B\mid X)
:This shall hold for any proposition B, which leads to
::w(A\mid X)=1
*In case A given X is impossible, we have AB\mid X=A\mid X and A\mid BX=A\mid X due to the requirement of consistency. The general equation (with the A and B factors switched) then leads to
::w(A\mid X)=w(B\mid X)w(A\mid X)
:This shall hold for any proposition B, which, without loss of generality, leads to the solution
::w(A\mid X)=0
:Due to the requirement of monotonicity, this means that w maps plausibilities to the interval [0,1].
*The plausibility of a proposition determines the plausibility of the proposition's negation.
:This postulates the existence of a function f such that
::w(\text{not }A\mid X)=f(w(A\mid X))
:Because "a double negative is an affirmative", consistency with logic gives the functional equation
::f(f(x))=x,
:saying that the function f is an involution, i.e., it is its own inverse.
*Furthermore, Cox postulates the function f to be monotonic.
:The above functional equations and consistency with logic imply that
::w(AB\mid X)=w(A\mid X)f(w(\text{not }B\mid AX))=w(A\mid X)f\left(\frac{w(A\,\text{not }B\mid X)}{w(A\mid X)}\right)
:Since AB is logically equivalent to BA, we also get
::w(A\mid X)f\left(\frac{w(A\,\text{not }B\mid X)}{w(A\mid X)}\right)=w(B\mid X)f\left(\frac{w(B\,\text{not }A\mid X)}{w(B\mid X)}\right)
:If, in particular, B=\text{not }(AD), then also A\,\text{not }B=\text{not }B and B\,\text{not }A=\text{not }A, and we get
::w(A\,\text{not }B\mid X)=w(\text{not }B\mid X)=f(w(B\mid X))
:and
::w(B\,\text{not }A\mid X)=w(\text{not }A\mid X)=f(w(A\mid X))
:Abbreviating w(A\mid X)=x and w(B\mid X)=y, we get the functional equation
::x\,f\left(\frac{f(y)}{x}\right)=y\,f\left(\frac{f(x)}{y}\right)
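Under Cox's regularity assumptions, the solutions of this last functional equation form the one-parameter family f(x)=(1-x^m)^{1/m} with m>0, which is what yields the rule w^m(A\mid X)+w^m(\text{not }A\mid X)=1 in the next section. The Python snippet below is a minimal numerical sketch, not part of Cox's derivation: it only checks, on randomly sampled points, that this family satisfies the involution condition f(f(x))=x and the functional equation just derived. The exponents, the sampling scheme, and the tolerances are arbitrary illustrative choices.

 # Numerical check that f(x) = (1 - x**m)**(1/m) satisfies
 #   f(f(x)) = x   and   x*f(f(y)/x) = y*f(f(x)/y)
 # (a sketch with arbitrary sample points; not part of Cox's proof)
 import random

 def f(x, m):
     # candidate involution f(x) = (1 - x**m)**(1/m)
     return (1.0 - x**m) ** (1.0 / m)

 random.seed(0)
 for m in (0.5, 1.0, 2.0, 3.0):
     for _ in range(1000):
         x, y = random.random(), random.random()
         # keep every argument of f strictly inside (0, 1]
         if x == 0.0 or y == 0.0 or x**m + y**m < 1.0 + 1e-6:
             continue
         assert abs(f(f(x, m), m) - x) < 1e-9          # f is an involution
         lhs = x * f(f(y, m) / x, m)
         rhs = y * f(f(x, m) / y, m)
         assert abs(lhs - rhs) < 1e-9                  # x f(f(y)/x) = y f(f(x)/y)
 print("involution and functional-equation checks passed on all sampled points")

The check relies on the algebraic identity x\,f(f(y)/x)=(x^m+y^m-1)^{1/m}, which is symmetric in x and y whenever x^m+y^m\ge 1.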


Implications of Cox's postulates

The laws of probability derivable from these postulates are the following (Edwin Thompson Jaynes, ''Probability Theory: The Logic of Science,'' Cambridge University Press, 2003; a preprint version appeared in 1996, and chapters 1 to 3 of the published version are available at http://bayes.wustl.edu/etj/prob/book.pdf). Let A\mid B be the plausibility of the proposition A given B satisfying Cox's postulates. Then there is a function w mapping plausibilities to the interval [0,1] and a positive number m such that
# Certainty is represented by w(A\mid B)=1.
# w^m(A\mid B)+w^m(\text{not }A\mid B)=1.
# w(AB\mid C)=w(A\mid C)w(B\mid AC)=w(B\mid C)w(A\mid BC).
It is important to note that the postulates imply only these general properties. We may recover the usual laws of probability by setting a new function, conventionally denoted P or \Pr, equal to w^m. Then we obtain the laws of probability in a more familiar form:
# Certain truth is represented by \Pr(A\mid B)=1, and certain falsehood by \Pr(A\mid B)=0.
# \Pr(A\mid B)+\Pr(\text{not }A\mid B)=1.
# \Pr(AB\mid C)=\Pr(A\mid C)\Pr(B\mid AC)=\Pr(B\mid C)\Pr(A\mid BC).
Rule 2 is a rule for negation, and rule 3 is a rule for conjunction. Given that any proposition containing conjunction, disjunction, and negation can be equivalently rephrased using conjunction and negation alone (by De Morgan's laws), we can now handle any compound proposition. The laws thus derived yield finite additivity of probability, but not countable additivity. The measure-theoretic formulation of Kolmogorov assumes that a probability measure is countably additive. This slightly stronger condition is necessary for the proof of certain theorems.
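To make the relationship \Pr=w^m concrete, the short Python check below starts from an arbitrarily chosen finite joint distribution over two propositions A and B given background information C (the numbers and the exponent m are illustrative assumptions, not anything prescribed by the theorem). It verifies that the familiar negation and conjunction rules hold for \Pr, and that the corresponding general forms hold for the underlying plausibility w=\Pr^{1/m}.

 # Illustrative check of the negation and conjunction rules, with Pr = w**m.
 # The joint distribution and the exponent m are assumed example values.
 m = 2.5                                   # any positive exponent works

 # joint probabilities Pr(A and B | C), etc.
 joint = {("A", "B"): 0.30, ("A", "notB"): 0.20,
          ("notA", "B"): 0.15, ("notA", "notB"): 0.35}

 pr_A  = joint[("A", "B")] + joint[("A", "notB")]
 pr_B  = joint[("A", "B")] + joint[("notA", "B")]
 pr_AB = joint[("A", "B")]
 pr_B_given_A = pr_AB / pr_A
 pr_A_given_B = pr_AB / pr_B

 # Rule 2 (negation) and rule 3 (conjunction) in the familiar Pr form
 assert abs(pr_A + (1 - pr_A) - 1.0) < 1e-12
 assert abs(pr_AB - pr_A * pr_B_given_A) < 1e-12
 assert abs(pr_AB - pr_B * pr_A_given_B) < 1e-12

 def w(p):
     # underlying plausibility, so that Pr = w**m
     return p ** (1.0 / m)

 # The same facts in Cox's general form for w
 assert abs(w(pr_A) ** m + w(1 - pr_A) ** m - 1.0) < 1e-12   # w^m + w^m(not) = 1
 assert abs(w(pr_AB) - w(pr_A) * w(pr_B_given_A)) < 1e-12    # product rule for w
 print("negation and conjunction rules verified for Pr and for w = Pr**(1/m)")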


Interpretation and further discussion

Cox's theorem has come to be used as one of the justifications for the use of Bayesian probability theory. For example, in Jaynes it is discussed in detail in chapters 1 and 2 and is a cornerstone for the rest of the book. Probability is interpreted as a formal system of logic, the natural extension of Aristotelian logic (in which every statement is either true or false) into the realm of reasoning in the presence of uncertainty.

It has been debated to what degree the theorem excludes alternative models for reasoning about uncertainty. For example, if certain "unintuitive" mathematical assumptions were dropped then alternatives could be devised, e.g., an example provided by Halpern. However, Arnborg and Sjödin suggest additional "common sense" postulates, which would allow the assumptions to be relaxed in some cases while still ruling out the Halpern example. Other approaches were devised by Hardy, and by Dupré and Tipler (Dupré, Maurice J. & Tipler, Frank J. (2009), "New Axioms for Rigorous Bayesian Probability," ''Bayesian Analysis,'' 4(3): 599–606).

The original formulation of Cox's theorem is in Cox (1946), which is extended with additional results and more discussion in Cox (1961). Jaynes cites Abel for the first known use of the associativity functional equation. János Aczél provides a long proof of the "associativity equation" (pages 256–267). Jaynes reproduces the shorter proof by Cox in which differentiability is assumed. A guide to Cox's theorem by Van Horn aims at comprehensively introducing the reader to all these references.


See also

* Probability axioms
* Probability logic


References


Further reading

* Smith, C. Ray & Erickson, Gary (1989), "From Rationality and Consistency to Bayesian Probability," in John Skilling (ed.), ''Maximum Entropy and Bayesian Methods,'' Dordrecht: Kluwer, pp. 29–44, ISBN 0-7923-0224-9, doi:10.1007/978-94-015-7860-8_2